
    Deaths attributable to diabetes in the United States: comparison of data sources and estimation approaches

    OBJECTIVE: The goal of this research was to identify the fraction of deaths attributable to diabetes in the United States. RESEARCH DESIGN AND METHODS: We estimated population attributable fractions (PAF) for cohorts aged 30-84 who were surveyed in the National Health Interview Survey (NHIS) between 1997 and 2009 (N = 282,322) and in the National Health and Nutrition Examination Survey (NHANES) between 1999 and 2010 (N = 21,814). Cohort members were followed prospectively for mortality through 2011. We identified diabetes status using self-reported diagnoses in both NHIS and NHANES and using HbA1c in NHANES. Hazard ratios associated with diabetes were estimated using Cox models adjusted for age, sex, race/ethnicity, educational attainment, and smoking status. RESULTS: We found a high degree of consistency between data sets and definitions of diabetes in the hazard ratios, estimates of diabetes prevalence, and estimates of the proportion of deaths attributable to diabetes. The proportion of deaths attributable to diabetes was estimated to be 11.5% using self-reports in NHIS, 11.7% using self-reports in NHANES, and 11.8% using HbA1c in NHANES. Among the sub-groups that we examined, the PAF was highest among obese persons at 19.4%. The proportion of deaths in which diabetes was assigned as the underlying cause of death (3.3-3.7%) severely understated the contribution of diabetes to mortality in the United States. CONCLUSIONS: Diabetes may represent a more prominent factor in American mortality than is commonly appreciated, reinforcing the need for robust population-level interventions aimed at diabetes prevention and care.
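    As an illustration of how a population attributable fraction relates a prevalence to a hazard ratio, the sketch below applies Levin's formula; this is a simplified stand-in, not the survey-weighted estimator used in the study, and the input values are hypothetical placeholders.

```python
# Minimal sketch: population attributable fraction (PAF) via Levin's formula,
# PAF = p*(HR - 1) / (1 + p*(HR - 1)), where p is the exposure prevalence and
# HR the adjusted hazard ratio. Illustrative only; the paper estimates the PAF
# from individual-level NHIS/NHANES data with Cox models.

def levin_paf(prevalence: float, hazard_ratio: float) -> float:
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

if __name__ == "__main__":
    # Hypothetical inputs: 10% diabetes prevalence and a hazard ratio of 2.0.
    print(f"PAF = {levin_paf(0.10, 2.0):.1%}")  # -> 9.1%
```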

    Rediscovering Renaissance Recipes: Digital Presentation for a 16th Century Text

    This project seeks to create a web-based system for working with a French text from 1509, Platine en francoys, which has been transcribed into an XML (Extensible Markup Language)-based file format using the conventions of the TEI (Text Encoding Initiative). Through the incorporation of several web technologies such as NodeJS, the application provides section-by-section navigation that allows versions of the text to be seen in multiple configurations. The options include a side-by-side presentation mode that allows easy comparison of the original against a regularized-spelling version or other variants. A facsimile-focused view is also planned, along with tools leveraging the specialized markup and focused searches on non-recipe text, recipes, and ingredients. It is expected that these features will allow for a deeper understanding of the text as well as function as a foundation for future development work as part of the ongoing cross-disciplinary computing/language collaboration efforts.
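    A minimal sketch of the kind of back-end step such a system needs: reading a TEI-encoded transcription and collecting its section-level divisions for navigation. The file name and the assumption that sections are div elements carrying an n attribute are illustrative; the project's actual markup conventions and NodeJS implementation may differ.

```python
# Sketch: extract section divisions from a TEI XML transcription.
import xml.etree.ElementTree as ET

TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}  # standard TEI namespace

def load_sections(path: str) -> dict:
    """Map each section identifier (hypothetical @n attribute) to its text."""
    root = ET.parse(path).getroot()
    sections = {}
    for div in root.iterfind(".//tei:text//tei:div", TEI_NS):
        key = div.get("n") or f"section-{len(sections) + 1}"
        sections[key] = " ".join(t.strip() for t in div.itertext() if t.strip())
    return sections

if __name__ == "__main__":
    # "platine_en_francoys.xml" is a hypothetical file name.
    for name, text in load_sections("platine_en_francoys.xml").items():
        print(name, text[:60])
```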

    Improved classification for compositional data using the α-transformation

    In compositional data analysis an observation is a vector containing non-negative values, only the relative sizes of which are considered to be of interest. Without loss of generality, a compositional vector can be taken to be a vector of proportions that sum to one. Data of this type arise in many areas including geology, archaeology, biology, economics and political science. In this paper we investigate methods for classification of compositional data. Our approach centres on the idea of using the α-transformation to transform the data and then to classify the transformed data via regularised discriminant analysis and the k-nearest-neighbours algorithm. Using the α-transformation generalises two rival approaches in compositional data analysis, one (when α=1) that treats the data as though they were Euclidean, ignoring the compositional constraint, and another (when α=0) that employs Aitchison's centred log-ratio transformation. A numerical study with several real datasets shows that whether using α=1 or α=0 gives better classification performance depends on the dataset, and moreover that using an intermediate value of α can sometimes give better performance than using either 1 or 0. Comment: This is a 17-page preprint and has been accepted for publication in the Journal of Classification.
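    The sketch below illustrates one way the pipeline described above could look in code: apply the α-transformation and classify the transformed data with k-nearest neighbours. The centring and scaling constants follow one common presentation of the transformation; published definitions additionally multiply by a Helmert sub-matrix to drop the redundant dimension, and that step, along with regularised discriminant analysis, is omitted here.

```python
# Sketch: alpha-transformation followed by k-nearest-neighbour classification.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def alpha_transform(X: np.ndarray, alpha: float) -> np.ndarray:
    """Rows of X are compositions (non-negative, summing to one)."""
    D = X.shape[1]
    if alpha == 0.0:                           # limiting case: centred log-ratio
        logX = np.log(X)
        return logX - logX.mean(axis=1, keepdims=True)
    U = X**alpha / np.sum(X**alpha, axis=1, keepdims=True)
    return (D * U - 1.0) / alpha               # alpha=1: an affine map of the raw data

def knn_classify(X_train, y_train, X_test, alpha=0.5, k=5):
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(alpha_transform(X_train, alpha), y_train)
    return knn.predict(alpha_transform(X_test, alpha))
```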

    Impacts of bacterial evolution on host lethality in Drosophila

    Evolution is the process by which species change their genetic traits, such as the pathogenicity of bacteria, over time in response to changes in their environment. Although the genetic mechanisms underlying many evolutionary processes have been revealed, it is still not well understood how opportunistic pathogens, such as Pseudomonas aeruginosa, become virulent. The overall goal of this thesis is to test the Coincidental Evolution Hypothesis, which proposes that the virulence of opportunistic pathogens evolves coincidentally as a by-product of their interaction with their natural predators. I hypothesized that the virulence of ancestral Pseudomonas aeruginosa changes over time if it co-evolves with its natural predator, the amoeba. Specifically, I predicted that evolved Pseudomonas aeruginosa becomes more virulent in order to survive against the amoeba. To test this hypothesis, I infected Drosophila, the fruit fly, an alternative host to humans, with ancestral and evolved Pseudomonas aeruginosa. Survival analysis showed that the evolved strain of Pseudomonas aeruginosa was more virulent than the ancestral strain. This provides insights into how opportunistic pathogens might evolve and could eventually be used in pharmaceutical research to combat bacterial antibiotic resistance.
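    A minimal sketch of the survival comparison described above, using Kaplan-Meier curves and a log-rank test to contrast hosts infected with the ancestral versus the evolved strain. The lifelines package and the small arrays of survival times and death indicators are assumptions for illustration; the thesis's actual data and analysis may differ.

```python
# Sketch: compare host survival between two infection groups.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical data: hours until death (or end of observation) per fly;
# 1 = death observed, 0 = censored.
t_anc = np.array([72, 80, 96, 96, 120, 120]); d_anc = np.array([1, 1, 1, 0, 1, 0])
t_evo = np.array([24, 30, 36, 48, 48, 60]);   d_evo = np.array([1, 1, 1, 1, 1, 1])

km = KaplanMeierFitter()
km.fit(t_evo, event_observed=d_evo, label="evolved strain")
print("median survival (evolved):", km.median_survival_time_)

res = logrank_test(t_anc, t_evo, event_observed_A=d_anc, event_observed_B=d_evo)
print(f"log-rank p-value: {res.p_value:.3f}")
```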

    A data-based power transformation for compositional data

    Compositional data analysis is carried out either by neglecting the compositional constraint and applying standard multivariate data analysis, or by transforming the data using the logs of the ratios of the components. In this work we examine a more general transformation which includes both approaches as special cases. It is a power transformation and involves a single parameter, α. The transformation has two equivalent versions. The first is the stay-in-the-simplex version, which is the power transformation as defined by Aitchison in 1986. The second version, which is a linear transformation of the power transformation, is a Box-Cox type transformation. We discuss a parametric way of estimating the value of α, namely maximization of its profile likelihood (assuming multivariate normality of the transformed data), and the equivalence between the two versions is exhibited. Other ways include maximization of the correct classification probability in discriminant analysis and maximization of the pseudo R-squared (as defined by Aitchison in 1986) in linear regression. We examine the relationship between the α-transformation, the raw data approach and the isometric log-ratio transformation. Furthermore, we also define a suitable family of metrics corresponding to the family of α-transformations and consider the corresponding family of Fréchet means. Comment: Published in the proceedings of the 4th international workshop on Compositional Data Analysis. http://congress.cimne.com/codawork11/frontal/default.as
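    As a hedged sketch of the two versions mentioned above (the exact scaling constants, and the Helmert sub-matrix often used to remove the redundant dimension, vary across presentations and are not taken from this abstract):

```latex
% Stay-in-the-simplex power transformation (Aitchison, 1986):
\[
  u_j(\mathbf{x};\alpha) \;=\; \frac{x_j^{\alpha}}{\sum_{k=1}^{D} x_k^{\alpha}},
  \qquad j = 1,\dots,D.
\]
% Box-Cox-type linear version of the same transformation, which tends to the
% centred log-ratio transform as \alpha \to 0:
\[
  \mathbf{z}(\mathbf{x};\alpha) \;=\; \frac{1}{\alpha}\bigl(D\,\mathbf{u}(\mathbf{x};\alpha) - \mathbf{1}_D\bigr).
\]
```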

    Research and development of triploid brown trout Salmo trutta (Linnaeus, 1758) for use in aquaculture and fisheries management

    Freshwater sport fisheries contribute substantially to the economies of England and Wales. However, many trout fisheries rely partly or entirely on stocking farmed trout to maintain catches within freshwater fisheries. Farmed trout often differ genetically from their wild counterparts, and wild trout could be at risk of reduced fitness due to interbreeding or competition with farmed fish. Therefore, to protect remaining wild brown trout (Salmo trutta L.) populations and as a conservation measure, stocking policy has changed. Under legislation introduced by the Environment Agency (EA, 2009), consent will now only be given for stocking rivers and some stillwaters with sterile, all-female triploid brown trout. There are reliable triploidy induction protocols for some other commercially important salmonid species; however, there is limited knowledge on triploid induction in brown trout. Previously, triploid brown trout have been produced by heat shocks, although reduced survivals were obtained, suggesting that an optimised heat shock had not been identified, or that heat shock gives less consistent success than hydrostatic pressure shock (HP), which is now recognised as a more reliable technique to produce triploid fish. Thus, the overall aim of this thesis was to conduct novel research to support the aquaculture and freshwater fisheries sector within the United Kingdom by optimising the production and furthering the knowledge of triploid brown trout. Firstly, this PhD project investigated an optimised triploidy induction protocol using hydrostatic pressure (Chapter 2). In order to produce an optimised hydrostatic pressure induction protocol, three experiments were conducted to (1) determine the optimal timing of HP shock application post-fertilisation, (2) define the optimal pressure intensity and duration of the HP shock and (3) study the effect of temperature (6-12 °C) on triploid yields. Results indicated high survival to yolk sac absorption stage (69.2 - 93.6 %) and high triploid yields (82.5 - 100 %) from the range of treatments applied. Furthermore, no significant differences in triploid rates were shown when shock timings and durations were adjusted according to the temperature used. In all treatments deformity prevalence remained low during incubation (<1.8 %) up to yolk sac absorption (~550 degree days post hatch). Overall, this study indicated that the optimised pressure shock for the induction of triploidy in brown trout, delivering high survival and a 100 % triploid rate (a prerequisite to brown trout restocking), is a shock with a magnitude of 689 Bar applied at 300 Centigrade Temperature Minutes (CTM) for a duration of 50 CTM. Regarding the assessment of triploid status, the second experimental chapter tested the accuracy and efficacy of three ploidy verification techniques (Chapter 3). The techniques studied were erythrocyte nuclei measurements (image analysis), flow cytometry (Becton Dickinson FACSCalibur flow cytometer) and DNA profiling (22 polymorphic microsatellite loci) to assess the effectiveness of triploidy induction in brown trout. Results indicated the validity of using erythrocyte major nuclear-axis measurements, flow cytometric DNA distributions expressed as relative fluorescence (FL2-Area), and polymorphic microsatellite loci (Ssa410UOS, SSa197, Str2 and SsaD48) for assessing ploidy status in brown trout. Accuracy of each technique was assessed and indicated that all techniques correctly identified ploidy level, indicating a 100 % triploid rate for that commercial batch of brown trout. 
These techniques may be utilised within aquaculture and freshwater fisheries to ensure compliance with the legislation introduced by the EA. As a result of the legislation introduced by the Environment Agency, triploid brown trout will freely interact with diploid trout; there is therefore a need to assess feeding response and behavioural differences between diploid and triploid trout prior to release. Accordingly, in the third experimental chapter (Chapter 4), diploid and triploid brown trout were acclimated for six weeks on two feeding regimes (floating/sinking pellet). Thereafter, aggression and surface feeding response were compared between all-diploid, mixed diploid/triploid and all-triploid pairs of brown trout in a semi-natural stream (flume). In each pairwise matching, fish of similar size were placed in allopatry and rank was determined by the total number of aggressive interactions initiated. Dominant individuals initiated more aggression than subordinates, spent more time defending a territory and positioned themselves closer to the food source (Gammarus pulex), whereas subordinates occupied the peripheries. When ploidy was considered, diploid trout were more aggressive than triploids, and dominated their siblings when placed in pairwise matchings. However, surface feeding did not differ statistically between ploidies irrespective of feeding regime. Triploids adopted a sneak feeding strategy while diploids expended more time defending a territory. In addition, an assessment of whether triploids exhibited a similar social dominance to diploids when placed in allopatry was conducted. Although aggression was lower in triploid pairs than in the diploid/triploid pairs, a dominance hierarchy was observed between individuals of the same ploidy. Dominant triploid fish were more aggressive and consumed more feed items than subordinate individuals. Subordinate fish displayed a darker colour index than dominant fish, suggesting increased stress levels. However, dominant triploid fish seemed more tolerant of subordinate individuals and did not display the same degree of invasive aggression as observed in the diploid/diploid or diploid/triploid matchings. These novel findings suggest that sterile triploid brown trout feed similarly to, but are less aggressive than, diploid trout and therefore may provide freshwater fishery managers with an alternative to stocking diploid brown trout. In addition to research at the applied level in triploid brown trout, this thesis also examined the fundamental physiological effects of ploidy in response to temperature regime. Triploid salmonids have been shown to differ in their tolerance to environmental temperature. Therefore, the fourth experimental chapter (Chapter 5) investigated whether temperature tolerance affected feed intake and exercise recovery. Diploid and triploid brown trout were exposed to an incremental temperature challenge (10 and 19 °C) and subsequent survival and feed intake rates were monitored. Triploids took longer to acclimate to the increase in temperature; however, feed intake was significantly greater in triploids at the high temperature. In a follow-on study, we investigated post-exercise recovery processes under each temperature regime (10 and 19 °C). Exhaustion was induced by 10 minutes of forced swimming, with subsequent haematological responses measured to determine the magnitude of recovery from exercise. 
Plasma parameters (alkaline phosphatase, aspartate aminotransferase, calcium, cholesterol, triglycerides, phosphorus, total protein, lactate, glucose, pH, magnesium, osmolality, potassium, sodium, chloride, lactate dehydrogenase) were measured for each ploidy. Basal samples were taken prior to exercise and then at 1, 4, and 24 hours post-exercise. Contrary to previous studies, there was no triploid mortality during or after the exercise at either temperature. Although diploid and triploid brown trout responded metabolically to the exercise, the magnitude of the response was affected by ploidy and temperature. In particular, triploids had higher levels of plasma lactate and osmolality, and lower pH, than diploids at 1 hour post exhaustive exercise. By 4 hours post-exercise the plasma parameters analysed had returned to near-basal levels. It was evident that the magnitude of the physiological disturbance post-exercise was greater in triploids than diploids at 19 °C. This may have implications where catch-and-release is practised on freshwater fisheries. Overall, this work aimed to develop and/or refine current industry induction and assessment protocols while better understanding the behaviour and physiology of diploid and triploid brown trout. The knowledge gained from this work provides aquaculture and freshwater fisheries with an optimised protocol which delivers 100 % triploid rates and profitability without compromising farmed trout welfare, thus ultimately leading towards a more sustainable brown trout industry within the United Kingdom.
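    The shock timings in the protocol above are specified in Centigrade Temperature Minutes (CTM). Assuming the usual convention that CTM is simply water temperature (°C) multiplied by minutes post-fertilisation, the small conversion sketch below turns the reported values into clock time; the temperature is illustrative and the thesis protocol should be consulted for the authoritative definition.

```python
# Sketch: convert Centigrade Temperature Minutes (CTM) to clock time.

def ctm_to_minutes(ctm: float, temperature_c: float) -> float:
    """Minutes post-fertilisation corresponding to a CTM value at a given temperature."""
    return ctm / temperature_c

if __name__ == "__main__":
    temp = 10.0  # deg C, an illustrative incubation temperature
    onset = ctm_to_minutes(300, temp)      # shock onset: 30 min post-fertilisation
    duration = ctm_to_minutes(50, temp)    # shock duration: 5 min
    print(f"At {temp} C: apply the 689 bar shock at {onset:.0f} min for {duration:.0f} min")
```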

    Logged In And Connected? A Quantitative Analysis Of Online Course Use And Alumni Giving

    Business and education stand out as two of the most prominent sectors affected by the rapid expansion of the Internet. A significant body of literature within business has been devoted to developing positive e-commerce exchanges that build customer loyalty. While online education grows each year, the long-term significance of online education for developing a loyal alumni base has yet to be studied. Findings in the marketing research literature on trust and loyalty hold exceptional significance for online education, especially in a tight economic climate that has forced colleges and universities to rely on alumni giving for operational support. This study examines the significance of online course use as a predictive variable for alumni giving at one medium-sized, private liberal arts university, using data from 3,450 students. The results show a negative correlation between online course use and alumni giving, alongside the other predictive variables of alumni giving examined. The findings provide foundational insights for education administrators and fundraisers involved in online education and its effect on alumni giving.

    Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. Assessing blade structural integrity is a complex task requiring an initial characterization of whether resonance is possible, followed by a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components and then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Two studies have therefore been performed comparing frequency response analysis with transient response analysis of bladed disks subjected to this complex flow environment. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways based upon cold-flow testing from the Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the response. The results instead showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists. Because the bulk of resonance problems are due to the "clean" excitations, a 10% underprediction is not necessarily a problem, especially since the average response in the transient analysis is similar to the frequency response result, so in a realistic finite-life calculation the life would be the same. However, in the rare cases when the "messy" excitation harmonics are identified as the source of potential resonance concerns, this research does indicate that frequency response analysis is inadequate for accurate characterization of blade structural capability.
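    A minimal sketch of the decomposition step named above: sampling a circumferential pressure distribution at the blade locations and extracting its spatial Fourier (nodal-diameter) content with an FFT. The blade count and the synthetic pressure signal are placeholders, not the CFD output used in these studies.

```python
# Sketch: spatial Fourier (nodal-diameter) decomposition of a circumferential load.
import numpy as np

n_blades = 60
theta = 2.0 * np.pi * np.arange(n_blades) / n_blades

# Placeholder loading: a dominant 8-nodal-diameter harmonic plus broadband
# "messy" content standing in for the complex flow field.
rng = np.random.default_rng(0)
pressure = np.cos(8 * theta) + 0.1 * rng.normal(size=n_blades)

spectrum = np.fft.rfft(pressure) / n_blades
for nd, coeff in enumerate(spectrum):
    amplitude = (2.0 if nd > 0 else 1.0) * np.abs(coeff)
    if amplitude > 0.05:
        print(f"nodal diameter {nd:2d}: amplitude {amplitude:.3f}")
```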

    Estimating Smoking-attributable Mortality in the United States

    Tobacco is the largest single cause of premature death in the developed world. Two methods of estimating the number of deaths attributable to smoking use mortality from lung cancer as an indicator of the damage from smoking. We re-estimate the coefficients of one of these, the Preston/Glei/Wilmoth model, using recent data from U.S. states. We calculate smoking-attributable fractions for the 50 states and the U.S. as a whole in 2000 and 2004. We estimate that 21% of adult deaths among men and 17% among women were attributable to smoking in 2004. Across states, attributable fractions range from 11% to 30% among men and from 7% to 23% among women. Smoking-related mortality also explains as much as 60% of the mortality disadvantage of Southern states. At the national level, our estimates are in close agreement with those of the Centers for Disease Control (CDC) and of Preston/Glei/Wilmoth, particularly for men. But we find greater variability by state than does the CDC. We suggest that our coefficients are suitable for calculating smoking-attributable mortality in contexts with relatively mature cigarette smoking epidemics.
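    As a schematic, heavily simplified sketch of the indirect-estimation idea behind models of this kind, in which lung cancer mortality serves as a proxy for accumulated smoking damage: the coefficient value, never-smoker lung cancer rate, and functional form below are illustrative assumptions, not the published Preston/Glei/Wilmoth estimates or specification.

```python
# Sketch: smoking-attributable fractions derived from lung cancer mortality
# used as a proxy for smoking damage. All numbers are illustrative.
import math

def af_other_causes(lung_rate, lung_rate_never, beta):
    """Attributable fraction of non-lung-cancer deaths, assuming a log-linear
    association between excess lung cancer mortality and other-cause mortality."""
    excess = max(lung_rate - lung_rate_never, 0.0)
    return 1.0 - math.exp(-beta * excess)

def af_lung(lung_rate, lung_rate_never):
    """Attributable fraction of lung cancer deaths themselves."""
    return max(lung_rate - lung_rate_never, 0.0) / lung_rate

if __name__ == "__main__":
    # Hypothetical inputs: observed and never-smoker lung cancer death rates
    # (per 100,000) and an illustrative regression coefficient.
    print(f"other causes: {af_other_causes(90.0, 14.0, beta=0.002):.1%}")
    print(f"lung cancer:  {af_lung(90.0, 14.0):.1%}")
```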